Double committee adaboost


Similar articles

Double-base asymmetric AdaBoost

Based on the use of different exponential bases to define class-dependent error bounds, a new and highly efficient asymmetric boosting scheme, coined as AdaBoostDB (Double-Base), is proposed. Supported by a fully theoretical derivation procedure, unlike most of the other approaches in the literature, our algorithm preserves all the formal guarantees and properties of original (cost-insensitive)...
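
A rough feel for cost-sensitive boosting can be had by biasing the initial sample weights by class, as sketched below. This simple heuristic (whose effect is known to fade over boosting rounds, which is part of the motivation for more principled schemes) is not the double-base derivation of AdaBoostDB; the cost values and toy dataset are invented for illustration.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Toy imbalanced problem; all settings here are invented for illustration.
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.8, 0.2], random_state=0)

# Simple cost-sensitive heuristic: errors on the positive class should hurt
# three times as much, so its samples start with three times the weight.
c_pos, c_neg = 3.0, 1.0
sample_weight = np.where(y == 1, c_pos, c_neg)

clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X, y, sample_weight=sample_weight)
print("training accuracy:", clf.score(X, y))
```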


‘Modest AdaBoost’ – Teaching AdaBoost to Generalize Better

Boosting is a technique for combining a set of weak classifiers into one high-performance prediction rule. Boosting has been successfully applied to problems such as object detection, text analysis, and data mining. The most widely used boosting algorithm is AdaBoost, along with its later, more effective variants Gentle and Real AdaBoost. In this article we propose a new boosting algorithm, whi...
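
As a concrete reference point for one of the variants mentioned, below is a minimal from-scratch sketch of Gentle AdaBoost in the formulation of Friedman, Hastie and Tibshirani (2000), using regression stumps as the weak learners. It is illustrative only and is not the Modest AdaBoost algorithm proposed in the article.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def gentle_adaboost(X, y, n_rounds=50):
    """Minimal Gentle AdaBoost sketch; labels y must be in {-1, +1}.
    Each round fits a regression stump to the labels by weighted least squares."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    learners = []
    for _ in range(n_rounds):
        stump = DecisionTreeRegressor(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        f = stump.predict(X)
        # Samples the current stump handles poorly (y * f small or negative)
        # become more important in the next round.
        w *= np.exp(-y * f)
        w /= w.sum()
        learners.append(stump)
    return learners

def predict(learners, X):
    """Sum the regression stumps and take the sign as the final decision."""
    F = sum(stump.predict(X) for stump in learners)
    return np.sign(F)
```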


Regularizing AdaBoost

AdaBoost is an iterative algorithm to construct classifier ensembles. It quickly achieves high accuracy by focusing on objects that are difficult to classify. Because of this, AdaBoost tends to overfit when subjected to noisy datasets. We observe that this can be partially prevented with the use of validation sets, taken from the same noisy training set. But using less than th...
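
The validation-set idea can be illustrated with a generic early-stopping sketch: hold out part of the noisy training data and keep the boosting round that scores best on it. This uses scikit-learn's staged_score on an invented toy dataset and is not the specific procedure studied in the paper.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Noisy toy data: flip_y injects 20% label noise (settings are illustrative).
X, y = make_classification(n_samples=2000, n_features=20,
                           flip_y=0.2, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

clf = AdaBoostClassifier(n_estimators=300, random_state=0).fit(X_tr, y_tr)

# staged_score reports validation accuracy after every boosting round,
# so the round that generalizes best can be kept instead of all 300.
val_scores = list(clf.staged_score(X_val, y_val))
best_round = int(np.argmax(val_scores)) + 1
print("best number of rounds on the validation split:", best_round)
```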


Explaining AdaBoost

Boosting is an approach to machine learning based on the idea of creating a highly accurate prediction rule by combining many relatively weak and inaccurate rules. The AdaBoost algorithm of Freund and Schapire was the first practical boosting algorithm, and remains one of the most widely used and studied, with applications in numerous fields. This chapter aims to review some of the many perspec...
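
For concreteness, a compact sketch of the original discrete AdaBoost update of Freund and Schapire is given below, with decision stumps as the weak rules; the implementation details are illustrative rather than taken from the chapter.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost(X, y, n_rounds=50):
    """Discrete AdaBoost sketch; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum()
        err = np.clip(err, 1e-10, 1 - 1e-10)   # avoid division by zero / log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # vote given to this weak rule
        w *= np.exp(-alpha * y * pred)         # up-weight misclassified samples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def strong_classifier(stumps, alphas, X):
    """Weighted majority vote over the weak rules."""
    votes = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(votes)
```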


Regularizing AdaBoost

Boosting methods maximize a hard classification margin and are known as powerful techniques that do not exhibit overfitting for low noise cases. Also for noisy data boosting will try to enforce a hard margin and thereby give too much weight to outliers, which then leads to the dilemma of non-smooth fits and overfitting. Therefore we propose three algorithms to allow for soft margin classification by ...



Journal

Journal title: Journal of King Saud University - Science

Year: 2013

ISSN: 1018-3647

DOI: 10.1016/j.jksus.2012.02.001